Web Survey Bibliography
Since the early stages of public opinion research, nonresponse has been identified as a major threat to how well a sample can represent the population of interest. Researchers have documented a trend of declining response rates over the years. However, the nonresponse rate becomes a concern only when it introduces error or bias into survey results. One way to estimate nonresponse bias is through imputation. Online panels, which maintain a pool of respondents who are invited to participate in research through electronic means, face unique opportunities as well as challenges with regard to nonresponse and its imputation. Using data from a nationwide online panel, this paper hypothesizes that nonresponse bias may exist because response propensity and opinion placement share common causes. After testing for these common causes, imputations are made to estimate the missing values. Lastly, the observed distributions on variables of interest are compared with the imputed distributions to show the scope of nonresponse bias. This paper finds that nonresponse biases may exist in online panels. First, the theoretical model of nonresponse bias was supported because the common-cause pattern was found in the dataset: response propensity and the opinion items of interest appeared to share common causes, mostly demographic variables. Second, the imputation analyses show that although most of the differences between imputed and measured opinions do not indicate serious biases, there were a few cases in which the differences seemed to be critical. The limitations of this study, especially those of the imputation method, are discussed at the end of the chapter, along with suggestions for future research.
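The procedure the abstract describes — a common cause driving both response propensity and an opinion item, imputation of the missing opinions, and a comparison of observed versus imputed distributions — can be sketched on simulated data. This is an illustrative sketch only, not the paper's actual model: the choice of a single demographic common cause ("age") and the simple linear-regression imputation are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical common cause: a standardized demographic variable ("age")
# that drives both the opinion item and the propensity to respond.
age = rng.normal(size=n)
opinion = 0.5 * age + rng.normal(size=n)
responded = rng.random(n) < 1.0 / (1.0 + np.exp(-0.8 * age))

# The naive estimate uses respondents only, so it inherits their age skew.
observed_mean = opinion[responded].mean()

# Impute nonrespondents' opinions from the shared covariate: fit a simple
# linear regression on respondents, predict for nonrespondents.
slope, intercept = np.polyfit(age[responded], opinion[responded], 1)
imputed = opinion.copy()
imputed[~responded] = slope * age[~responded] + intercept

full_mean = imputed.mean()

# The gap between the respondent-only mean and the imputation-completed
# mean is one indication of the scope of nonresponse bias.
bias_estimate = observed_mean - full_mean
```

Because higher-propensity cases here also hold higher opinion values, the respondent-only mean overstates the full-sample mean, and the imputed distribution pulls the estimate back toward it.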
In M. Callegaro, R. Baker, J. Bethlehem, A. S. Göritz, J. A. Krosnick and P. J. Lavrakas (eds.): Online Panel Research: A Data Quality Perspective. John Wiley & Sons, Ltd, Chichester, UK
- Validating respondents' identity in online samples; 2014; Baker, R., Miller, C., Kachhi-Jiwani, D., Lange, K., Wilding-Brown, L., Tucker, J.
- The relationship between nonresponse strategies and measurement error; 2014; Malhotra, N., Miller, J. M., Wedeking, J.
- Nonresponse and measurement error in an online panel; 2014; Roberts, C., Allum, N., Sturgis, P.
- Estimating the effects of nonresponses in online panels through imputation; 2014; Zhang, W.
- An empirical test of the impact of smartphones on panel-based online data collection; 2014; Drewes, F.
- Professional respondents in nonprobability online panels; 2014; Hillygus, D. S., Jackson, N. M., Young, M.
- Informing panel members about study results; 2014; Scherpenzeel, A., Toepoel, V.
- Determinants of the starting rate and the completion rate in online panel studies; 2014; Göritz, A.
- The untold story of multi-mode (online and mail) consumer panels; 2014; McCutcheon, A. L., Rao, K., Kaminska, O.
- Online panels and validity; 2014; Grönlund, K., Strandberg, K.
- Assessing representativeness of a probability-based online panel in Germany; 2014; Struminskaya, B., Kaczmirek, L., Schaurer, I., Bandilla, W.
- A critical review of studies investigating the quality of data obtained with online panels based on...; 2014; Callegaro, M., Villar, A., Yeager, D. S., Krosnick, J. A.
- Online panel research: History, concepts, applications and a look at the future; 2014; Callegaro, M., Baker, R., Bethlehem, J., Göritz, A., Krosnick, J. A., Lavrakas, P. J.
- Motives for joining nonprobability online panels and their association with survey participation behavior...; 2014; Keusch, F., Batinic, B., Mayerhofer, W.
- Improving web survey quality; 2014; Steinmetz, S., Bianchi, S. M., Tijdens, K. G., Biffignandi, S.